Finite Connectivity Attractor Neural Networks

Author

  • B. Wemmenhove
Abstract

We study a family of diluted attractor neural networks with a finite average number of (symmetric) connections per neuron. As in finite connectivity spin glasses, their equilibrium properties are described by order parameter functions, for which we derive an integral equation in replica symmetric (RS) approximation. A bifurcation analysis of this equation reveals the locations of the paramagnetic to recall and paramagnetic to spin-glass transition lines in the phase diagram. The line separating the retrieval phase from the spin-glass phase is calculated at zero temperature. All phase transitions are found to be continuous.
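The type of model described in the abstract is straightforward to explore numerically. The sketch below (Python) is an illustrative simulation rather than anything taken from the paper: it places a Hopfield-type network on an Erdős–Rényi graph with finite mean connectivity c, assigns Hebbian couplings only to the existing bonds, and runs sequential Glauber dynamics at temperature T. The parameter values N, p, c, T and the 1/c coupling scaling are assumptions made for the example and need not match the paper's conventions.

```python
# Minimal simulation sketch (not the paper's replica analysis): a Hopfield-type
# attractor network on a sparse random graph with finite mean connectivity c,
# Hebbian couplings on the existing bonds, and Glauber dynamics at temperature T.
# All parameter values and scaling conventions here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N, p, c, T = 2000, 3, 5.0, 0.2          # neurons, stored patterns, mean connectivity, temperature
xi = rng.choice([-1, 1], size=(p, N))   # random binary patterns

# Erdos-Renyi dilution: each symmetric bond present with probability c/N.
mask = rng.random((N, N)) < c / N
mask = np.triu(mask, 1)
mask = mask | mask.T

# Hebbian couplings restricted to the existing bonds, scaled by the mean connectivity.
J = mask * (xi.T @ xi) / c
np.fill_diagonal(J, 0.0)

# Start close to pattern 0 and run sequential Glauber dynamics.
s = xi[0].copy()
flip = rng.random(N) < 0.15
s[flip] *= -1

for sweep in range(50):
    for i in rng.permutation(N):
        h = J[i] @ s                              # local field on neuron i
        prob_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
        s[i] = 1 if rng.random() < prob_up else -1
    m = xi @ s / N                                # overlaps with the stored patterns
    if sweep % 10 == 0:
        print(f"sweep {sweep:3d}  overlaps {np.round(m, 3)}")
```

Monitoring the overlaps with the stored patterns indicates which regime the dynamics relaxes into: one overlap close to 1 corresponds to recall, while all overlaps near 0 corresponds to the paramagnetic or spin-glass regime, the regions separated by the transition lines computed in the paper.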


Similar articles

Attractor neural networks with patchy connectivity

We investigate the effects of patchy (clustered) connectivity in sparsely connected attractor neural networks (NNs). This study is motivated by the fact that the connectivity of pyramidal neurons in layer II/III of the mammalian visual cortex is patchy and sparse. The storage capacity of hypercolumnar attractor NNs that use the Hopfield and Willshaw learning rules with this kind of connectivity...


Classifiers with limited connectivity

For many neural network models that are based on perceptrons, the number of activity patterns that can be classified is limited by the number of plastic connections that each neuron receives, even when the total number of neurons is much larger. This poses the problem of how the biological brain can take advantage of its huge number of neurons given that the connectivity is extremely sparse, es...


Learning Vehicle Traffic Videos using Small-World Attractor Neural Networks

The goal of this work is to learn and retrieve a sequence of highly correlated patterns using a Hopfield-type of Attractor Neural Network (ANN) with a small-world connectivity distribution. For this model, we propose a weight learning heuristic which combines the pseudo-inverse approach with a row-shifting schema. The influence of the ratio of random connectivity on retrieval quality and learni...


Slowly evolving random graphs II: Adaptive geometry in finite-connectivity Hopfield models

We present an analytically solvable random graph model in which the connections between the nodes can evolve in time, adiabatically slowly compared to the dynamics of the nodes. We apply the formalism to finite connectivity attractor neural network (Hopfield) models and we show that due to the minimisation of the frustration effects the retrieval region of the phase diagram can be sig...


Design of Continuous Attractor Networks with Monotonic Tuning Using a Symmetry Principle

Neurons that sustain elevated firing in the absence of stimuli have been found in many neural systems. In graded persistent activity, neurons can sustain firing at many levels, suggesting a widely found type of network dynamics in which networks can relax to any one of a continuum of stationary states. The reproduction of these findings in model networks of nonlinear neurons has turned out to b...



Journal:

Volume   Issue

Pages  -

Publication date: 2003